Conference Location: Minneapolis, MN
Publication Date: August 23, 2022
Conference Dates: June 26, 2022 – June 29, 2022
DOI: 10.18260/1-2--41236
Permanent URL: https://peer.asee.org/41236
Stacy Klein-Gardner's career in P-12 STEM education focuses on increasing the interest and participation of females and underrepresented minorities (URMs) and on teacher professional development. She is an Adjunct Professor of Biomedical Engineering at Vanderbilt University, where she serves as the co-PI and co-Director of the NSF-funded Engineering For US All (e4usa) project. She also serves as the co-PI, Lead Engineer, and Director of Partnerships for Youth Engineering Solutions (YES)-Middle School at Pennsylvania State University. Dr. Klein-Gardner formerly served as the chair of the ASEE P12 Commission and the PCEE division. She is a Fellow of the Society.
Gail Lynn Goldberg is a nationally recognized expert in student assessment. After serving for a decade as an assessment specialist for the Maryland State Department of Education, she has spent over twenty-five years working as an independent educational consultant with and for schools (K-12 and post-secondary), districts, states, the U.S. Department of Education, and a number of assessment organizations. Her areas of expertise include (but are not limited to) item/task writing, rubric development, and judgment-based scoring. Dr. Goldberg also conducts professional development activities on such topics as classroom assessment practice, using assessment results to inform instructional practice, and literacy learning across content areas.
Research prior to 2005 found that no single framework existed that could capture the engineering design process fully or well and benchmark each element of the process to a commonly accepted set of referenced artifacts. Complicating the construction of a stepwise, artifact-driven framework is the fact that engineering design is typically practiced over time as a complex and iterative process. For both novice and advanced students, learning and applying the design process is often cumulative, with many informal and formal programmatic opportunities to practice essential elements.
The Engineering Design Process Portfolio Scoring Rubric (EDPPSR) was designed to apply to any portfolio intended to document an individual- or team-driven process leading to an original attempt to design a product, process, or method that provides an optimal solution to a genuine and meaningful problem. In essence, the portfolio should be a detailed account or “biography” of a project and the thought processes that inform that project. Besides narrative and explanatory text, entries may include (but need not be limited to) drawings, schematics, photographs, notebook and journal entries, transcripts or summaries of conversations and interviews, and audio/video recordings. Such entries are likely to be necessary in order to convey accurately and completely the complex thought processes behind the planning, implementation, and self-evaluation of the project. The rubric comprises four main components, each in turn comprising three elements. Each element has its own holistic rubric.
The process by which the EDPPSR was created gives evidence of the relevance and representativeness of the rubric and helps to establish validity. The EDPPSR model as originally rendered has a strong theoretical foundation, as it was developed by reference to the literature on the steps of the design process, through focus groups, and through expert review by teachers, faculty, and researchers in performance-based portfolio rubrics and assessments. Using the unified construct validity framework, the EDPPSR's validity was further established through expert reviewers (experts in engineering design) providing evidence supporting the content relevance and representativeness of the EDPPSR in representing the basic process of engineering design.
This manuscript offers empirical evidence that supports the use of the EDPPSR model to evaluate student design-based projects in a reliable and valid manner. Intra-class correlation coefficients (ICC) were calculated to determine the inter-rater reliability (IRR) of the rubric. Given the small sample size, we also examined 95% confidence intervals to provide a range of values in which the estimate of inter-rater reliability is likely contained.
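The abstract does not specify which ICC form was computed. As an illustration only, the sketch below implements ICC(2,1) — the two-way random effects, absolute agreement, single-rater form often used for inter-rater reliability studies in which each rater scores every portfolio. The ratings matrix is hypothetical, not data from this study.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `ratings` is an n-targets x k-raters matrix (each rater scores
    every target, e.g. every portfolio element).
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-target means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Mean squares from the two-way ANOVA decomposition
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between targets
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    sse = np.sum((ratings - row_means[:, None]
                  - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                         # residual

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical scores: 3 portfolios rated by 2 raters who agree exactly
print(icc_2_1([[1, 1], [2, 2], [3, 3]]))  # → 1.0
```

Because ICC(2,1) measures absolute agreement, a rater with a constant scoring offset lowers the coefficient even when rank ordering is identical; for example, `icc_2_1([[1, 2], [2, 3], [3, 4]])` is about 0.67 rather than 1.0.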
Klein-Gardner, S., Abts, L., & Goldberg, G. (2022, August). The Engineering Design Process Portfolio Scoring Rubric (EDPPSR) – Initial Validity and Reliability (Fundamental). Paper presented at 2022 ASEE Annual Conference & Exposition, Minneapolis, MN. https://doi.org/10.18260/1-2--41236
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2022 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015